AIR-021

AI Inference System Based on NVIDIA® Jetson Orin™ NX/Nano

  • A compact, high-performance AI inference box
  • Powered by NVIDIA® Jetson Orin NX 16GB/8GB or Orin Nano 8GB/4GB modules
  • Wide input voltage range of 12 ~ 24 V and wide operating temperature range of -10 ~ 55 °C
  • Multiple I/O interfaces: LAN, DIO, COM, CANBus, USB 3.2, and MIPI via USB Type-C
  • Multiple expansion interfaces: M.2 B Key, M.2 E Key, and SD card slot
  • Supports Ubuntu Linux 22.04 LTS and JetPack SDK 6.2
  • Supports GenAI Studio for model fine-tuning and inference applications

Edge AI SDK

Download: v3.5.0

System
  • OS: Ubuntu 22.04 (kernel: 5.15.148-tegra)
  • JetPack: 6.2
  • RAM: 16GB

AI Packages & Tools
  • TensorRT: 10.3.0
  • CUDA: 12.5
  • DeepStream: 7.1
  • Vision AI Demos: Object Detection, Face Detection, Person Detection, Pose Estimation
  • GenAI Demos: Chatbot
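
The GenAI demos above include a chatbot, which is typically driven through an HTTP chat-completion API. The sketch below builds such a request payload in Python; the OpenAI-style field layout and the model name are assumptions for illustration, not taken from this page — check the GenAI Studio documentation for the actual endpoint contract.

```python
import json

def build_chat_request(prompt: str, model: str = "demo-llm") -> str:
    """Build an OpenAI-style chat-completion payload (assumed API shape).

    `model` and the field names are illustrative placeholders.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return json.dumps(payload)

# Serialize a request for the on-device chatbot demo
print(build_chat_request("What is the operating temperature range of the AIR-021?"))
```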

AI Software Stack on AIR-021 (v3.5.0)

The stack combines components from Advantech, NVIDIA, and open source:

  • AI Application: Vision AI Apps, Gen AI Chatbot, Pre-trained Models, Monitoring
  • Benchmark & Tools: tegrastats, jetson_benchmarks
  • Optimization: Performance Accelerator
  • AI Runtime SDK: TensorRT 10.3.0, CUDA 12.5, DeepStream 7.1
  • AI Accelerator: None
  • System: JetPack 6.2
  • OS / Kernel: Ubuntu-tegra

Documents

Datasheet